40 research outputs found

    Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis

    Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion, with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information, we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was based on an experimentally validated model of the visual motion pathway of insects, whose core elements are correlation-type elementary motion detectors (EMDs). The key result of our analysis is that the absolute EMD responses, i.e. the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights the contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both the velocity and the textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way.
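
    As an illustration of the correlation-type detector scheme described above, the following Python sketch implements a minimal Hassenstein-Reichardt-style EMD array and its motion energy profile. It is a simplified stand-in for the experimentally validated model; the filter time constant, sampling interval, and array layout are assumptions, not the published parameters.

```python
import numpy as np

def lowpass(x, tau=0.035, dt=0.001):
    """First-order low-pass filter (assumed time constant tau, time step dt)."""
    alpha = dt / (tau + dt)
    y = np.zeros_like(x, dtype=float)
    for t in range(1, len(x)):
        y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
    return y

def emd_array(luminance, tau=0.035, dt=0.001):
    """Correlation-type EMDs on a (time, photoreceptor) luminance array.

    Each detector multiplies the low-pass-filtered (delayed) signal of one
    photoreceptor with the undelayed signal of its neighbour and subtracts
    the mirror-symmetric term (Hassenstein-Reichardt scheme).
    """
    delayed = np.apply_along_axis(lowpass, 0, luminance, tau, dt)
    return delayed[:, :-1] * luminance[:, 1:] - luminance[:, :-1] * delayed[:, 1:]

def motion_energy(emd_responses):
    """Absolute value of the EMD output; during translation at roughly
    constant speed this profile is related to contrast-weighted nearness."""
    return np.abs(emd_responses)
```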

    Structural and Functional Evolution of the Trace Amine-Associated Receptors TAAR3, TAAR4 and TAAR5 in Primates

    The family of trace amine-associated receptors (TAAR) comprises 9 mammalian TAAR subtypes, with the numbers of intact genes and pseudogenes differing considerably even between closely related species. To date, the best-characterized subtype is TAAR1, which activates the Gs protein/adenylyl cyclase pathway upon stimulation by trace amines and psychoactive substances such as MDMA or LSD. Recently, a chemosensory function involving the recognition of volatile amines was proposed for murine TAAR3, TAAR4 and TAAR5. Humans can smell volatile amines despite carrying open reading frame (ORF) disruptions in TAAR3 and TAAR4. Therefore, we set out to study the functional and structural evolution of these genes with a special focus on primates. Functional analyses showed that ligands activating murine TAAR3, TAAR4 and TAAR5 do not activate intact primate and other mammalian orthologs, although these orthologs evolve under purifying selection and hence must be functional. We also find little evidence for positive selection that could explain the functional differences between mouse and other mammals. Our findings rather suggest that the previously identified volatile-amine agonists of TAAR3–5 reflect the high agonist promiscuity of TAAR, and that the ligands driving purifying selection of these TAAR in mouse and other mammals still await discovery. More generally, our study points out how analyses in an evolutionary context can help to interpret functional data generated in a single species.

    Local motion adaptation enhances the representation of spatial structure at EMD arrays

    Neuronal representation and extraction of spatial information are essential for behavioral control. For flying insects, a plausible way to gain spatial information is to exploit the distance-dependent optic flow that is generated during translational self-motion. Optic flow is computed by arrays of local motion detectors retinotopically arranged in the second neuropile layer of the insect visual system. These motion detectors have adaptive response characteristics, i.e. their responses to motion with a constant or only slowly changing velocity decrease, while their sensitivity to rapid velocity changes is maintained or even increases. By means of a modeling approach, we analyzed how motion adaptation affects signal representation at the output of arrays of motion detectors during simulated flight in artificial and natural 3D environments. We focused on translational flight, because spatial information is only contained in the optic flow induced by translational locomotion. Indeed, flies, bees and other insects segregate their flight into relatively long intersaccadic translational flight sections interspersed with brief and rapid saccadic turns, presumably to maximize the periods of translation (80% of the flight). With a novel adaptive model of the insect visual motion pathway, we show that the motion detector responses to background structures of cluttered environments are largely attenuated as a consequence of motion adaptation, while responses to foreground objects stay constant or even increase. This conclusion holds even under the dynamic flight conditions of insects.
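
    A minimal sketch of the kind of adaptation described above, assuming a divisive gain control driven by a slow estimate of the recent response magnitude; the actual adaptive mechanism and parameters of the published model may differ.

```python
import numpy as np

def adapt_emd_output(emd, tau_a=0.5, dt=0.001, gain=5.0):
    """Divisive adaptation of EMD responses (illustrative mechanism only).

    A slow adaptation state tracks the recent response magnitude of each
    detector and divisively scales the instantaneous output, so responses to
    sustained constant-velocity motion decrease while rapid transients
    (e.g. a nearby object entering the receptive field) remain prominent.
    `emd` is assumed to be a (time, detector) array; tau_a, dt and gain are
    assumed values.
    """
    alpha = dt / (tau_a + dt)
    state = np.zeros(emd.shape[1])
    adapted = np.empty_like(emd, dtype=float)
    for t in range(emd.shape[0]):
        adapted[t] = emd[t] / (1.0 + gain * state)
        state += alpha * (np.abs(emd[t]) - state)   # slow magnitude estimate
    return adapted
```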

    Direction-independent motion adaptation and contrast gain reduction.

    (A-C) Model responses (red) to 1 s of sine-wave grating motion before and after 4 s of motion adaptation (for corresponding LPTC responses to the same type of stimulus, see Figures 2 and 5 in [28]). During the motion adaptation period, the sine-wave grating with high contrast and velocity moved in (A) the preferred direction (PD), (B) the null direction (ND), or (C) an orthogonal direction. (D) For the same stimulus scheme, the brightness contrast of the grating during the reference and test periods was systematically varied, and contrast gain was assessed by calculating the normalized response for the first 300 ms of the reference and test periods (solid line: contrast gain before motion adaptation; dotted and dashed lines: contrast gain after PD and ND adaptation; see Figure 2 in [28] for corresponding experimental data).
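
    A sketch of how the contrast-gain measure described in (D) could be computed from model responses; the 300 ms window follows the caption, while the normalization to the across-contrast maximum and the argument names are assumptions.

```python
import numpy as np

def normalized_window_response(responses, onset_s, dt=0.001, window_s=0.3):
    """Mean response in the first 300 ms after stimulus onset, normalized
    to the largest value across the tested brightness contrasts.

    `responses` is assumed to be a (n_contrasts, n_timesteps) array of model
    responses; `onset_s` marks the onset of the reference or test grating.
    """
    i0 = int(onset_s / dt)
    n = int(window_s / dt)
    mean_resp = responses[:, i0:i0 + n].mean(axis=1)
    return mean_resp / mean_resp.max()
```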

    Impact of motion adaptation on spatial vision for semi-natural flight dynamics.

    Schematic of the spatial layout of a 3D environment and the flight trajectory of an artificial agent with (A) only translational movement or (C) semi-natural flight consisting of eight cycles of a decagonal trajectory. (B) The side view as the agent passes by a bar, which is shared between conditions (A) and (C) for all three wall distances tested (black: 0.55 m, green: 2 m, and red: 4 m distance between wall and trajectory). (D) Average motion energy at 90° azimuth over time for a wall distance of 2 m, averaged over 50 different wall and bar patterns (same as Fig 6D). (F) Average response contrast (over 50 different wall and bar patterns) between bar and background responses over time for all three wall distances (same as Fig 6H). (E, G) The same analysis as in (D, F), but under semi-natural flight conditions in the environment illustrated in (C).

    Impact of motion adaptation on spatial vision during translation in an artificial 3D environment.

    (A) Schematic illustration of the spatial layout of the artificial 3D environment and the flight trajectory of an artificial agent translating parallel to a row of bars and a wall behind the bars. (B) The projection of the environment onto the left hemisphere of a spherical eye. (C) The EMD response profile before adaptation (while the first bar passes by; left subfigure) and after adaptation (while the eighth bar passes by; right subfigure). (D) Motion energy averaged across elevation at 90° azimuth as a function of time; the response to the background wall is shown in the inset on a finer scale (red and green: sections of the response used to assess peak responses to bars and background for determining response contrast). (E) Response contrast between bar and background responses during the passage of each of the eight bars.
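
    The two quantities plotted in (D) and (E) can be sketched as follows; the (time, elevation, azimuth) array layout and the Michelson-style contrast definition are assumptions made for illustration, not the paper's exact definitions.

```python
import numpy as np

def azimuth_slice_energy(motion_energy, azimuth_idx):
    """Motion energy at one azimuth (e.g. the column closest to 90 degrees),
    averaged across elevation, as a function of time.
    `motion_energy` is assumed to be a (time, elevation, azimuth) array."""
    return motion_energy[:, :, azimuth_idx].mean(axis=1)

def response_contrast(bar_peak, background_level):
    """Contrast between the peak response to a bar and the response level to
    the background wall; a Michelson-style form is used here as a stand-in
    for the measure defined in the paper."""
    return (bar_peak - background_level) / (bar_peak + background_level)
```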

    Enhancement of response contrast by motion adaptation over a wide range of stimulus parameters.

    (A) An example (corresponding to the condition marked by the black frame in B) of the model response (black) to the same stimulus scheme as in Fig 2, in which the peak response to the temporal frequency transient (red) and the response to the constant background temporal frequency (green) of the first and the last temporal frequency decrement were used to assess whether the response contrast to temporal frequency transients is enhanced by adaptation. (B) The change of response contrast to temporal frequency transients (see Eq (5); red: enhancement, blue: reduction of response contrast with adaptation) assessed over a wide range of brightness contrasts of the sine-wave grating and of constant temporal frequencies (smaller plots: the same analysis for light conditions brighter by eight decades). (C, D) Same as (A, B), but with transient temporal frequency increments rather than decrements superimposed on the background motion.
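
    The adaptation-induced change of response contrast assessed in (B) could be computed along the following lines; the contrast measure below is a Michelson-style stand-in for the paper's Eq (5), whose exact form is not reproduced here.

```python
def response_contrast_change(peak_first, bg_first, peak_last, bg_last):
    """Change of response contrast between the first (unadapted) and the
    last (adapted) temporal-frequency transient; positive values indicate
    that adaptation enhances the contrast between transient and background
    responses."""
    def rc(peak, bg):
        return (peak - bg) / (peak + bg)
    return rc(peak_last, bg_last) - rc(peak_first, bg_first)
```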

    Enhancement of motion detector response contrast with adaptation for different fore- and background depth differences.

    (A) Schematic of the spatial layout of a 3D environment and the flight trajectory of an artificial agent; the environmental design is the same as in Fig 5A, but with three different wall distances in different scenarios (black: wall distance 0.55 m, green: 2 m, red: 4 m). (B) The average motion energy across elevations at 90° azimuth over time, as assessed in Fig 5D, but averaged over 50 different wall and bar patterns. (C) Response contrast between each bar and the background response with adaptation. Results obtained from 50 different random wall and bar patterns are summarized in box plots (mid-line: median; box: 25th–75th percentile; red cross: outlier). (D-G) Same as (B, C), but with wall distances of 2 m and 4 m, respectively. (H) Averaged response contrast between bar and background as a function of time for all three scenarios with different wall distances, averaged over 50 different random wall and bar patterns.

    A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes

    Avoiding collisions is one of the most basic needs of any mobile agent, both biological and technical, whether searching around or aiming toward a goal. We propose a model of collision avoidance inspired by behavioral experiments on insects and by the properties of optic flow on a spherical eye experienced during translation, and we test the interaction of this model with goal-driven behavior. Insects, such as flies and bees, actively separate the rotational and translational optic flow components via their behavior, i.e. by employing a saccadic strategy of flight and gaze control. Optic flow experienced during translation, i.e. during intersaccadic phases, contains information on the depth structure of the environment, but this information is entangled with that on self-motion. Here, we propose a simple model to extract the depth structure from translational optic flow by using local properties of a spherical eye. On this basis, a motion direction of the agent is computed that ensures collision avoidance. Flying insects are thought to measure optic flow by correlation-type elementary motion detectors. Their responses depend not only on velocity but also on the texture and contrast of objects and thus do not measure the velocity of objects veridically. Therefore, we initially used geometrically determined optic flow as input to a collision avoidance algorithm to show that depth information inferred from optic flow is sufficient to account for collision avoidance under closed-loop conditions. Then, the collision avoidance algorithm was tested with bio-inspired correlation-type elementary motion detectors as its input. Even then, the algorithm successfully led to collision avoidance and, in addition, replicated the characteristics of the collision avoidance behavior of insects. Finally, the collision avoidance algorithm was combined with a goal direction and tested in cluttered environments. The simulated agent then showed goal-directed behavior reminiscent of components of the navigation behavior of insects.
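
    A sketch of the two steps described above: recovering nearness from translational optic flow on a spherical eye, and turning the resulting nearness map into an avoidance direction. The nearness-weighted averaging used here is only one plausible way to derive the avoidance direction, and all function and parameter names are illustrative.

```python
import numpy as np

def nearness_from_flow(flow_mag, angle_to_heading, speed):
    """Nearness (inverse distance) from purely translational optic flow.

    For translation at `speed`, the local flow magnitude on a spherical eye
    is speed * sin(angle to heading) / distance, so nearness follows by
    division; viewing directions close to the heading (sin -> 0) are clipped
    because the estimate becomes unreliable there."""
    s = np.clip(np.sin(angle_to_heading), 1e-3, None)
    return flow_mag / (speed * s)

def avoidance_direction(nearness, azimuth, elevation):
    """Steering direction that points away from nearby objects.

    `nearness` is a (n_elevation, n_azimuth) map; azimuth and elevation are
    the corresponding viewing angles in radians. The agent moves opposite to
    the nearness-weighted average viewing direction; this captures the idea
    of the model but not necessarily its exact weighting."""
    az, el = np.meshgrid(azimuth, elevation)
    dirs = np.stack([np.cos(el) * np.cos(az),
                     np.cos(el) * np.sin(az),
                     np.sin(el)], axis=-1)          # unit viewing directions
    weighted = (nearness[..., None] * dirs).sum(axis=(0, 1))
    return -weighted / (np.linalg.norm(weighted) + 1e-12)
```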